Inequalities in information theory (from the English Wikipedia)
Inequalities are very important in the study of information theory. There are a number of different contexts in which these inequalities appear.
==Shannon-type inequalities==
Consider a finite collection of finitely (or at most countably) supported random variables on the same probability space. For a collection of ''n'' random variables, there are 2^''n'' − 1 non-empty subsets for which entropies can be defined. For example, when ''n'' = 2, we may consider the entropies H(X_1), H(X_2), and H(X_1, X_2), and express the following inequalities (which together characterize the range of the marginal and joint entropies of two random variables):
* H(X_1) \ge 0
* H(X_2) \ge 0
* H(X_1) \le H(X_1, X_2)
* H(X_2) \le H(X_1, X_2)
* H(X_1, X_2) \le H(X_1) + H(X_2).
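As a numerical sketch of these five inequalities, the snippet below computes the marginal and joint entropies of a small joint distribution (chosen arbitrarily for illustration) and checks each one:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array, skipping zero entries."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# A hypothetical joint pmf for (X1, X2) over a 2x3 alphabet.
p_joint = np.array([[0.125, 0.25,  0.125],
                    [0.25,  0.125, 0.125]])

h1  = entropy(p_joint.sum(axis=1))  # H(X1): marginalize out X2
h2  = entropy(p_joint.sum(axis=0))  # H(X2): marginalize out X1
h12 = entropy(p_joint)              # H(X1, X2): joint entropy

# The five inequalities characterizing two random variables:
assert h1 >= 0 and h2 >= 0              # non-negativity
assert h1 <= h12 + 1e-12                # H(X1) <= H(X1, X2)
assert h2 <= h12 + 1e-12                # H(X2) <= H(X1, X2)
assert h12 <= h1 + h2 + 1e-12           # subadditivity
```

For this particular distribution H(X_1) = 1 bit and H(X_1, X_2) = 2.5 bits, so every inequality holds with room to spare.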
In fact, these can all be expressed as special cases of a single inequality involving the conditional mutual information, namely
:I(A;B|C) \ge 0,
where A, B, and C each denote the joint distribution of some arbitrary (possibly empty) subset of our collection of random variables. Inequalities that can be derived from this are known as Shannon-type inequalities. More formally (following the notation of Yeung), define \Gamma^*_n to be the set of all ''constructible'' points in \mathbb{R}^{2^n - 1}, where a point is said to be constructible if and only if there is a joint, discrete distribution of ''n'' random variables such that each coordinate of that point, indexed by a non-empty subset of \{1, 2, \dots, n\}, is equal to the joint entropy of the corresponding subset of the ''n'' random variables. The closure of \Gamma^*_n is denoted \overline{\Gamma^*_n}, and the cone in \mathbb{R}^{2^n - 1} characterized by all Shannon-type inequalities among ''n'' random variables is denoted \Gamma_n. In general
:\Gamma^*_n \subseteq \overline{\Gamma^*_n} \subseteq \Gamma_n.
Software has been developed to automate the task of proving Shannon-type inequalities. Given an inequality, such software is able to determine whether the half-space defined by the inequality contains the cone \Gamma_n, in which case the inequality must hold for all entropic points, since \Gamma^*_n \subseteq \Gamma_n.
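The basic inequality I(A;B|C) \ge 0 can be spot-checked numerically. The sketch below (names and alphabet sizes are arbitrary choices, not from the source) computes conditional mutual information directly from a joint pmf and verifies non-negativity over randomly generated distributions:

```python
import numpy as np

rng = np.random.default_rng(0)

def cond_mutual_info(p):
    """I(A;B|C) in bits, computed from a joint pmf p[a, b, c]."""
    p_c  = p.sum(axis=(0, 1))   # p(c)
    p_ac = p.sum(axis=1)        # p(a, c)
    p_bc = p.sum(axis=0)        # p(b, c)
    total = 0.0
    for a in range(p.shape[0]):
        for b in range(p.shape[1]):
            for c in range(p.shape[2]):
                if p[a, b, c] > 0:
                    # I(A;B|C) = sum p(a,b,c) log [ p(a,b,c) p(c) / (p(a,c) p(b,c)) ]
                    total += p[a, b, c] * np.log2(
                        p[a, b, c] * p_c[c] / (p_ac[a, c] * p_bc[b, c]))
    return total

# I(A;B|C) >= 0 for every joint distribution; spot-check random pmfs.
for _ in range(100):
    p = rng.random((2, 3, 2))
    p /= p.sum()
    assert cond_mutual_info(p) >= -1e-12
```

When A and B are conditionally independent given C (for instance, a uniform joint pmf), the computed value is zero, matching the equality case of the inequality.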
